Important Properties of Normalized KL Divergence under the HMC Model
Abstract
Normalized Kullback-Leibler (KL) divergence between cover and stego objects has been shown to be an important quantity in proofs related to the Square Root Law. In these proofs, we need to find the Taylor expansion of this function with respect to the change rate β around β = 0, or to upper-bound some of its derivatives. We can expect this to be possible, since the normalized KL divergence should be arbitrarily smooth in β. In this report, we show that every derivative of this function is uniformly upper-bounded and Lipschitz-continuous with respect to β, and thus a Taylor expansion of any order can be carried out.

1. Report in a nutshell

If you are reading this report only because you need its main result, you need only read this section. If the cover image is modeled as a Markov chain and the embedding operation is mutually independent (LSB, ±1) and performed with change rate β ∈ [0, β_MAX] (we call this the HMC model), then we define the normalized KL divergence between the n-element cover distribution P^(n) and the n-element stego distribution Q_β^(n) as

(1.1)    (1/n) d_n(β) = (1/n) D_KL( P^(n) || Q_β^(n) ).

The main result of this report, formally stated in Theorem 3, tells us that every derivative of d_n(β)/n with respect to β (and the function d_n(β)/n itself) is uniformly bounded and Lipschitz-continuous (and hence continuous) on [0, β_0]. These properties are independent of n ≥ 1.

2. Introduction

The normalized KL divergence between n-element cover and stego distributions, defined by equation (1.1), has been shown to be a key quantity for studying and proving the Square Root Law (SRL). Although it is not hard to believe that this function and its derivatives are well-behaved (bounded and continuous), the proofs are somewhat lengthy and not particularly illuminating, and thus they are presented in this report, separately from other, more important results. In the rest of this section, we describe the notation used in this report. Section 3 summarizes the assumptions we are using and derives some of their basic consequences. The main result of this report is formulated in Section 4 and proved in Section 5. All necessary results from the theory of hidden Markov chains are presented in Appendix A. In addition to the notation developed before, we use the following symbols. We [notation]
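To make definition (1.1) concrete, the following minimal sketch (not taken from the report; the binary alphabet, the transition matrix A, and the tested change rates are illustrative assumptions) computes d_n(β)/n by brute-force enumeration for a small binary Markov-chain cover with mutually independent LSB-style embedding at change rate β.

```python
import itertools
import numpy as np

# Toy HMC model (illustrative parameters, not from the report):
# cover X_1..X_n is a binary first-order Markov chain with transition matrix A.
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Stationary initial distribution pi (left eigenvector of A for eigenvalue 1).
evals, evecs = np.linalg.eig(A.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

def cover_prob(x):
    """P^(n)(x) for a cover sequence x under the Markov-chain model."""
    p = pi[x[0]]
    for a, b in zip(x, x[1:]):
        p *= A[a, b]
    return p

def stego_prob(y, beta):
    """Q_beta^(n)(y): marginal of the hidden Markov chain obtained by flipping
    each cover symbol independently with change rate beta (LSB-style embedding)."""
    n = len(y)
    total = 0.0
    for x in itertools.product((0, 1), repeat=n):
        flips = sum(xi != yi for xi, yi in zip(x, y))
        total += cover_prob(x) * (beta ** flips) * ((1 - beta) ** (n - flips))
    return total

def normalized_kl(n, beta):
    """(1/n) * D_KL(P^(n) || Q_beta^(n)), computed by brute-force enumeration;
    for larger n one would use the forward algorithm of the hidden Markov chain."""
    d = 0.0
    for y in itertools.product((0, 1), repeat=n):
        p = cover_prob(y)
        q = stego_prob(y, beta)
        d += p * np.log(p / q)
    return d / n

if __name__ == "__main__":
    n = 6
    for beta in (0.0, 0.01, 0.05, 0.1):
        print(f"beta = {beta:<5} d_n(beta)/n = {normalized_kl(n, beta):.6e}")
```

For this toy model the printed values start at 0 for β = 0 and grow smoothly with β, which matches the qualitative behavior (boundedness and continuity of d_n(β)/n in β, uniformly in n) that the report establishes in general.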